111 research outputs found

    Expressive haptics for enhanced usability of mobile interfaces in situations of impairments

    Designing for situational impairments could lead to better solutions for disabled people; likewise, exploring the needs of disabled people could lead to innovations that address situational impairments. This, in turn, can produce non-stigmatising assistive technology for disabled people from which eventually everyone could benefit. In this paper, we investigate the potential for advanced haptics to complement the graphical user interface of mobile devices, thereby enhancing the experience both of visually impaired people and of all users in certain situations (e.g. sunlight interfering with interaction). We explore technical solutions to this problem space and justify our focus on the creation of kinaesthetic force feedback. We propose initial design concepts and studies, with a view to co-creating delightful and expressive haptic interactions with potential users, motivated by scenarios of situational and permanent impairment.
    Comment: Presented at the CHI'19 Workshop: Addressing the Challenges of Situationally-Induced Impairments and Disabilities in Mobile Interaction, 2019 (arXiv:1904.05382)

    CollaborationBus: An Editor for the Easy Configuration of Complex Ubiquitous Environment

    Early sensor-based infrastructures were often developed by experts with a thorough knowledge of the base technology for sensing information, for processing the captured data, and for adapting the system’s behaviour accordingly. In this paper we argue that end-users, too, should be able to configure Ubiquitous Computing environments. We introduce the CollaborationBus application: a graphical editor that abstracts away the base technology and thereby allows a wide range of users to configure Ubiquitous Computing environments. By composing pipelines, users can easily specify how information flows from selected sensors, via optional filters that process the sensor data, to actuators that change the system’s behaviour according to the users’ wishes. Users can compose pipelines for both home and work environments. An integrated sharing mechanism allows them to share their own compositions, and to reuse and build upon others’ compositions. Real-time visualisations help them understand how information flows through their pipelines. In this paper we present the concept, implementation, and early user feedback of the CollaborationBus application.
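    The sensor-to-filter-to-actuator flow described in this abstract can be sketched in a few lines of code. All class and function names below are hypothetical illustrations of the pipeline idea; the actual CollaborationBus editor is graphical, not code-based.

    ```python
    # Minimal sketch of the sensor -> filter -> actuator pipeline idea.
    class Pipeline:
        def __init__(self, sensor, filters, actuator):
            self.sensor = sensor          # callable producing a reading
            self.filters = filters        # callables transforming readings
            self.actuator = actuator      # callable consuming the result

        def step(self):
            """Run one pass of the pipeline and return the final value."""
            value = self.sensor()
            for f in self.filters:
                value = f(value)
            self.actuator(value)
            return value

    # Example: clamp a noisy light-level sensor and "dim a lamp" accordingly.
    readings = iter([300, 320, 900, 310])
    log = []
    pipe = Pipeline(
        sensor=lambda: next(readings),
        filters=[lambda lux: min(lux, 500)],   # clamp outliers
        actuator=lambda lux: log.append(lux),  # stand-in for a lamp
    )
    pipe.step()  # -> 300
    ```

    The sharing mechanism the abstract mentions would then amount to exchanging such pipeline compositions rather than code.
    
    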

    EagleSense: tracking people and devices in interactive spaces using real-time top-view depth-sensing

    Real-time tracking of people's location, orientation and activities is increasingly important for designing novel ubiquitous computing applications. Top-view camera-based tracking avoids occlusion when tracking people who are collaborating, but often requires complex tracking systems and advanced computer vision algorithms. To facilitate the prototyping of ubiquitous computing applications for interactive spaces, we developed EagleSense, a real-time human posture and activity recognition system with a single top-view depth-sensing camera. We contribute our novel algorithm and processing pipeline, including details for calculating silhouette-extremities features and applying gradient tree boosting classifiers for activity recognition optimised for top-view depth sensing. EagleSense provides easy access to the real-time tracking data and includes tools that facilitate integration into custom applications. We report the results of a technical evaluation with 12 participants and demonstrate the capabilities of EagleSense with application case studies.
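    As a rough illustration of the "silhouette extremities" idea above: from a top-view binary silhouette, the pixels farthest from the body centroid (candidate head and hands) give a simple feature vector. This sketch is our own hypothetical example, not EagleSense's actual pipeline, which feeds such features into gradient tree boosting classifiers.

    ```python
    # Distances of the k farthest silhouette pixels from the centroid,
    # as a toy "extremities" feature vector for a top-view depth mask.
    def extremity_features(mask, k=3):
        """mask: 2D list of 0/1; returns k descending centroid distances,
        zero-padded if the silhouette has fewer than k pixels."""
        pts = [(r, c) for r, row in enumerate(mask)
               for c, v in enumerate(row) if v]
        if not pts:
            return [0.0] * k
        cr = sum(r for r, _ in pts) / len(pts)
        cc = sum(c for _, c in pts) / len(pts)
        dists = sorted((((r - cr) ** 2 + (c - cc) ** 2) ** 0.5
                        for r, c in pts), reverse=True)
        feats = dists[:k]
        return feats + [0.0] * (k - len(feats))

    # A tiny cross-shaped "silhouette": four extremities around the centre.
    mask = [[0, 1, 0],
            [1, 1, 1],
            [0, 1, 0]]
    features = extremity_features(mask)
    ```
    
    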

    CurationSpace: Cross-Device Content Curation Using Instrumental Interaction

    For digital content curation of historical artefacts, curators collaboratively collect, analyse and edit documents, images, and other digital resources in order to display and share new representations of that information to an audience. Despite curators' increasing reliance on digital documents and tools, current technologies provide little support for these specific collaborative content curation activities. We introduce CurationSpace, a novel cross-device system that provides more expressive tools for curating and composing digital historical artefacts. Based on the concept of Instrumental Interaction, CurationSpace allows users to interact with digital curation artefacts on shared interactive surfaces, using personal smartwatches as selectors for instruments or modifiers (applied to the whole curation space, individual documents, or fragments). We introduce a range of novel interaction techniques that allow individuals or groups of curators to more easily create, navigate and share resources during content curation. We report insights from our user study about people's use of instruments and modifiers for curation activities.

    EvalMe: Exploring the value of new technologies for in situ evaluation of learning experiences

    Tangible interfaces have much potential for engendering shared interaction and reflection, as well as for promoting playful experiences. How can their properties be capitalised on to enable students to reflect on their learning, both individually and together, throughout learning sessions? This Research through Design paper describes our development of EvalMe, a flexible, tangible tool designed to be playful and enjoyable to use, and to enable children to reflect on their learning, both in the moment and after a learning session has ended. We discuss the insights gained through the process of designing EvalMe, co-defining its functionality with two groups of collaborators and deploying it in two workshop settings. Through this process, we map key contextual considerations for the design of technologies for in situ evaluation of learning experiences. Finally, we discuss how tangible evaluation technologies deployed throughout a learning session can positively contribute to students’ reflection on their learning.

    HCITools: strategies and best practices for designing, evaluating and sharing technical HCI toolkits

    Over the years, toolkits have been designed to facilitate the rapid prototyping of novel designs for graphical user interfaces, physical computing, fabrication, tangible interfaces and ubiquitous computing. However, although evaluation methods for HCI are widely available, techniques and approaches for evaluating technical toolkit research are less well developed. Moreover, it is unclear what kind of contribution and impact technical toolkits can bring to the larger HCI community. In this workshop we aim to bring together leading researchers in the field to discuss challenges and opportunities in developing new methods and approaches to design, evaluate, disseminate and share toolkits. Furthermore, we will discuss the technical, methodological and enabling role of toolkits for HCI research.


    Robust tracking of respiratory rate in high-dynamic range scenes using mobile thermal imaging

    The ability to monitor respiratory rate is extremely important for the medical treatment, healthcare and fitness sectors. In many situations, mobile methods, which allow users to undertake everyday activities, are required. However, current monitoring systems can be obtrusive, requiring users to wear respiration belts or nasal probes. Recent advances in thermographic systems have shrunk their size, weight and cost, to the point where it is possible to create smartphone-based respiration rate monitoring devices that are not affected by lighting conditions. However, mobile thermal imaging is challenged in scenes with high thermal dynamic ranges. This challenge is further amplified by general problems such as motion artifacts and low spatial resolution, leading to unreliable breathing signals. In this paper, we propose a novel and robust approach to respiration tracking that compensates for the negative effects of variations in the ambient temperature and of motion artifacts, and can accurately extract breathing rates in highly dynamic thermal scenes. The approach makes three main contributions. The first is a novel Optimal Quantization technique which adaptively constructs a color mapping of absolute temperature to improve segmentation, classification and tracking. The second is the Thermal Gradient Flow method, which computes thermal gradient magnitude maps to enhance the accuracy of nostril-region tracking. Finally, we introduce the Thermal Voxel method to increase the reliability of the captured respiration signals compared to the traditional averaging method. We demonstrate the extreme robustness of our system in tracking the nostril region and measuring the respiratory rate in high dynamic range scenes.
    Comment: Biomedical Optics Express, Vol. 8, No. 10, 1 Oct 2017, p. 4480. The full abstract can be found in the journal article (due to the limited word count of arXiv abstracts).
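    To illustrate the "thermal gradient magnitude maps" mentioned in this abstract, here is a minimal sketch that computes a per-pixel gradient magnitude over a 2D temperature grid using forward differences. This is our own simplification for illustration only; the paper's Thermal Gradient Flow method builds much more on top of such maps.

    ```python
    # Per-pixel gradient magnitude of a 2D temperature grid,
    # sqrt(dx^2 + dy^2) with forward differences (borders use 0).
    def gradient_magnitude(temps):
        h, w = len(temps), len(temps[0])
        out = [[0.0] * w for _ in range(h)]
        for r in range(h):
            for c in range(w):
                dx = temps[r][c + 1] - temps[r][c] if c + 1 < w else 0.0
                dy = temps[r + 1][c] - temps[r][c] if r + 1 < h else 0.0
                out[r][c] = (dx * dx + dy * dy) ** 0.5
        return out

    # A warm nostril-like region (34 C) against cooler skin (30 C)
    # produces large magnitudes along the boundary, which is what makes
    # gradient maps useful for locating and tracking the nostril region.
    grid = [[30.0, 30.0, 30.0],
            [30.0, 34.0, 30.0],
            [30.0, 30.0, 30.0]]
    mags = gradient_magnitude(grid)
    ```
    
    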